-
Time-delay cosmography is a powerful technique to constrain cosmological parameters, particularly the Hubble constant (H0). The TDCOSMO Collaboration is performing an ongoing analysis of lensed quasars to constrain cosmology using this method. In this work, we obtain constraints from the lensed quasar WGD 2038−4008 using new time-delay measurements and previous mass models by TDCOSMO. This is the first TDCOSMO lens to incorporate multiple lens-modeling codes and the full time-delay covariance matrix into the cosmological inference. The models are fixed before the time delay is measured, and the analysis is performed blinded with respect to the cosmological parameters to prevent unconscious experimenter bias. We obtain D_Δt = 1.68^{+0.40}_{−0.38} Gpc using two families of mass models: a power law describing the total mass distribution, and a composite model of baryons and dark matter, although the composite model is disfavored by kinematic constraints. In a flat ΛCDM cosmology, we constrain the Hubble constant to be H0 = 65^{+23}_{−14} km s^−1 Mpc^−1. The dominant source of uncertainty comes from the time delays, due to the low variability of the quasar. Future long-term monitoring, especially in the era of the Vera C. Rubin Observatory's Legacy Survey of Space and Time, could catch stronger quasar variability and further reduce the uncertainties. This system will be incorporated into an upcoming hierarchical analysis of the entire TDCOSMO sample, and improved time delays and spatially resolved stellar kinematics could strengthen the constraints from this system in the future.
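A quick numerical sketch of why D_Δt constrains H0: the time-delay distance D_Δt = (1 + z_d) D_d D_s / D_ds is built from angular-diameter distances, each of which scales as 1/H0, so D_Δt ∝ 1/H0 at fixed matter density. A minimal pure-Python illustration in flat ΛCDM (the redshifts below are illustrative placeholders, not the measured values for this lens):

```python
import math

def comoving_distance(z, H0=70.0, Om=0.3, n=2000):
    """Comoving distance in Mpc for flat LCDM: (c/H0) * integral of dz'/E(z')."""
    c = 299792.458  # speed of light, km/s
    dz = z / n
    s = 0.0
    # trapezoidal integration of 1/E(z)
    for i in range(n + 1):
        zi = i * dz
        E = math.sqrt(Om * (1 + zi) ** 3 + (1 - Om))
        w = 0.5 if i in (0, n) else 1.0
        s += w / E
    return c / H0 * s * dz

def time_delay_distance(zd, zs, H0=70.0, Om=0.3):
    """D_dt = (1 + zd) * Dd * Ds / Dds, from angular-diameter distances (flat universe)."""
    Dc_d = comoving_distance(zd, H0, Om)
    Dc_s = comoving_distance(zs, H0, Om)
    Dd = Dc_d / (1 + zd)
    Ds = Dc_s / (1 + zs)
    Dds = (Dc_s - Dc_d) / (1 + zs)  # flat-universe relation between the two lines of sight
    return (1 + zd) * Dd * Ds / Dds

# Illustrative lens/source redshifts (placeholders only)
zd, zs = 0.23, 0.78
D1 = time_delay_distance(zd, zs, H0=65.0)
D2 = time_delay_distance(zd, zs, H0=70.0)
# D_dt scales exactly as 1/H0 at fixed Om, which is why measuring it pins down H0
print(round(D1 / D2, 3))  # 1.077 ≈ 70/65
```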
-
Strong-lensing time delays enable the measurement of the Hubble constant (H0) independently of other traditional methods. The main limitation to the precision of time-delay cosmography is the mass-sheet degeneracy (MSD). Some previous TDCOSMO analyses broke the MSD by making standard assumptions about the mass density profile of the lens galaxy, reaching 2% precision from seven lenses. However, this approach could potentially bias the H0 measurement or underestimate the errors. For this work, we broke the MSD for the first time using spatially resolved kinematics of the lens galaxy in RXJ1131−1231 obtained from Keck Cosmic Web Imager spectroscopy, in combination with previously published time delays and lens models derived from Hubble Space Telescope imaging. This approach allowed us to robustly estimate H0, effectively implementing a maximally flexible mass model. Following a blind analysis, we estimated the angular diameter distance to the lens galaxy D_d = 865^{+85}_{−81} Mpc and the time-delay distance D_Δt = 2180^{+472}_{−271} Mpc, giving H0 = 77.1^{+7.3}_{−7.1} km s^−1 Mpc^−1 for a flat Λ cold dark matter cosmology. The error budget accounts for all uncertainties, including the MSD inherent to the lens mass profile and line-of-sight effects, and those related to the mass–anisotropy degeneracy and projection effects. Our new measurement is in excellent agreement with those obtained in the past using standard simply parametrized mass profiles for this single system (H0 = 78.3^{+3.4}_{−3.3} km s^−1 Mpc^−1) and for seven lenses (H0 = 74.2 ± 1.6 km s^−1 Mpc^−1), or for seven lenses using single-aperture kinematics and the same maximally flexible models used here (H0 = 73.3 ± 5.8 km s^−1 Mpc^−1). This agreement corroborates the methodology of time-delay cosmography.
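The MSD referred to above has a compact algebraic form: the transformation κ(θ) → λκ(θ) + (1 − λ) leaves all imaging observables unchanged while rescaling the predicted time delays, and hence the inferred H0, by λ. A schematic sketch (function names are ours, purely illustrative):

```python
def mass_sheet_transform(kappa, lam):
    """Rescale the lensing convergence: kappa -> lam*kappa + (1 - lam).
    Image positions and flux ratios are invariant under this transform."""
    return lam * kappa + (1.0 - lam)

def rescaled_h0(h0_model, lam):
    """Predicted time delays scale as lam under the transform, so matching the
    observed delays rescales the inferred H0 by the same factor lam."""
    return lam * h0_model

# kappa = 1 (the critical convergence) is a fixed point of the transform
print(round(mass_sheet_transform(1.0, 0.9), 6))  # 1.0
# A 10% internal mass sheet (lam = 0.9) shifts a nominal H0 by 10%
print(round(rescaled_h0(74.2, 0.9), 2))  # 66.78
```

This is exactly why external information such as spatially resolved stellar kinematics is needed: imaging alone cannot distinguish values of λ, but the lens galaxy's velocity dispersion profile can.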
-
Imaging data are the principal observable required to use galaxy-scale strong lensing in a multitude of applications in extragalactic astrophysics and cosmology. In this paper, we develop the Lensing Exposure Time Calculator (LensingETC; https://github.com/ajshajib/LensingETC) to optimize the efficiency of telescope-time usage when planning multifilter imaging campaigns for galaxy-scale strong lenses. This tool simulates realistic data tailored to specified instrument characteristics and then automatically models them to assess the power of the data in constraining lens-model parameters. We demonstrate a use case of this tool by optimizing a two-filter observing strategy (in the IR and the ultraviolet–visual (UVIS)) within the limited exposure time per system allowed by a Hubble Space Telescope (HST) Snapshot program. We find that, when there is a trade-off between signal-to-noise ratio and resolution (for example, between the UVIS and IR filters of the HST), higher resolution is more advantageous for gaining constraining power on the lensing observables. We also find that, whereas a point-spread function (PSF) with sub-Nyquist sampling allows the sample mean of a model parameter to be robustly recovered for both galaxy–galaxy and point-source lensing systems, a sub-Nyquist-sampled PSF introduces a larger scatter than a Nyquist-sampled one in the deviation from the ground truth for point-source lens systems.
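The LensingETC interface itself is documented in the linked repository; as a generic sketch of the core calculation any imaging exposure-time calculator performs, here is the standard CCD signal-to-noise equation and its inversion for the exposure time needed to reach a target S/N (all rates and values below are hypothetical placeholders, not HST instrument numbers):

```python
import math

def snr(source_rate, sky_rate, dark_rate, read_noise, npix, t):
    """Point-source CCD S/N: S*t / sqrt(S*t + npix*(B*t + D*t + RN^2)).
    Rates are in e-/s (sky and dark per pixel); read_noise in e- per pixel."""
    signal = source_rate * t
    noise = math.sqrt(signal + npix * (sky_rate * t + dark_rate * t + read_noise ** 2))
    return signal / noise

def exposure_for_snr(target, source_rate, sky_rate, dark_rate, read_noise, npix):
    """Invert snr(t) = target, which is a quadratic in t:
    S^2 t^2 - target^2 (S + npix (B + D)) t - target^2 npix RN^2 = 0."""
    a = source_rate ** 2
    b = -target ** 2 * (source_rate + npix * (sky_rate + dark_rate))
    c = -target ** 2 * npix * read_noise ** 2
    # c <= 0 guarantees a real, positive root
    return (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)

# Hypothetical example: how long to reach S/N = 10 on a faint source?
t_req = exposure_for_snr(10.0, source_rate=50.0, sky_rate=5.0,
                         dark_rate=0.1, read_noise=3.0, npix=9)
print(round(snr(50.0, 5.0, 0.1, 3.0, 9, t_req), 2))  # 10.0
```

A tool like the one described in the abstract goes further, propagating the simulated pixels through full lens modeling rather than stopping at S/N.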
-
Candidate Periodically Variable Quasars from the Dark Energy Survey and the Sloan Digital Sky Survey
Periodically variable quasars have been suggested as close binary supermassive black holes. We present a systematic search for periodic light curves in 625 spectroscopically confirmed quasars with a median redshift of 1.8 in a 4.6 deg² overlapping region of the Dark Energy Survey Supernova (DES-SN) fields and the Sloan Digital Sky Survey Stripe 82 (SDSS-S82). Our sample has a unique 20-year-long multicolor (griz) light curve enabled by combining DES-SN Y6 observations with archival SDSS-S82 data. The deep imaging allows us to search for periodic light curves in less luminous quasars (down to r ∼ 23.5 mag) powered by less massive black holes (with masses ≳ 10^8.5 M⊙) at high redshift for the first time. We find five candidates with significant periodicity (at >99.74% single-frequency significance in at least two bands, with a global p-value of ∼7 × 10^−4 to 3 × 10^−3 accounting for the look-elsewhere effect), with observed periods of ∼3–5 years (i.e., 1–2 years in the rest frame) and ∼4–6 cycles spanned by the observations. If all five candidates are periodically variable quasars, this translates into a detection rate of ∼0.8^{+0.5}_{−0.3}%, or ∼1.1^{+0.7}_{−0.5} quasars per deg². Our detection rate is 4–80 times larger than those found by previous searches using shallower surveys over larger areas. This discrepancy is likely caused by differences in the quasar populations probed and in the survey data qualities. We discuss implications for the future direct detection of low-frequency gravitational waves. Continued photometric monitoring will further assess the robustness and characteristics of these candidate periodic quasars to determine their physical origins.
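As a toy illustration of what a periodicity search does (the paper's actual analysis uses periodogram statistics and careful significance testing well beyond this), a phase-dispersion-minimization scan over trial periods can be sketched as follows; all numbers are synthetic:

```python
import math

def phase_fold_dispersion(t, y, period, nbins=10):
    """Fold the light curve at a trial period and sum the within-bin variances.
    The sum is smallest when the trial period matches the true one."""
    bins = [[] for _ in range(nbins)]
    for ti, yi in zip(t, y):
        phase = (ti / period) % 1.0
        bins[min(int(phase * nbins), nbins - 1)].append(yi)
    total = 0.0
    for b in bins:
        if len(b) > 1:
            mean = sum(b) / len(b)
            total += sum((v - mean) ** 2 for v in b)
    return total

def best_period(t, y, trial_periods):
    """Return the trial period with the smallest phase-folded dispersion."""
    return min(trial_periods, key=lambda p: phase_fold_dispersion(t, y, p))

# Synthetic noiseless sinusoid: 3.3 yr period, irregularly sampled over ~21 yr
times = [0.07 * i + 0.013 * (i % 7) for i in range(300)]
mags = [math.sin(2.0 * math.pi * ti / 3.3) for ti in times]
grid = [2.0 + 0.01 * k for k in range(301)]  # trial periods 2.0-5.0 yr
p_best = best_period(times, mags, grid)
print(p_best)  # recovered period, close to the true 3.3 yr
```

Real searches must additionally handle photometric noise, red-noise quasar variability, and the look-elsewhere effect mentioned in the abstract, which is what drives the quoted global p-values.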